An Efficient and Safe Framework for Solving Optimization Problems

Authors

  • Yahia Lebbah
  • Claude Michel
  • Michel Rueher
Abstract

Interval methods have shown their ability to locate and prove the existence of a global optimum in a safe and rigorous way. Unfortunately, these methods are rather slow. Efficient solvers for optimization problems are based on linear relaxations. However, the latter are unsafe, and thus may overestimate or, worse, underestimate the global minimum. This paper introduces QuadOpt, an efficient and safe framework to rigorously bound the global optimum as well as its location. QuadOpt uses consistency techniques to speed up the initial convergence of the interval narrowing algorithms. A lower bound is computed on a linear relaxation of the constraint system and the objective function. All these computations are based on a safe and rigorous implementation of linear programming techniques. First experimental results are very promising.
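
The abstract gives no pseudocode, so the following is only a minimal sketch of the generic interval branch-and-bound scheme that such frameworks build on: a per-box lower bound, an upper bound taken at a feasible midpoint, and bisection along the widest variable. It deliberately omits QuadOpt's consistency filtering and its safe linear-relaxation lower bound, uses plain floating point (hence it is not "safe" in the paper's sense), and the objective and all function names are illustrative choices, not the authors' code.

```python
# Minimal interval branch-and-bound sketch for global minimization.
# Illustrative only: no outward rounding, no consistency filtering, and a
# naive interval evaluation instead of a safe linear-relaxation lower bound.
import heapq

def interval_lower_bound(box):
    # Interval lower bound of f(x, y) = x^2 + y^2 - x*y over
    # box = ((xl, xu), (yl, yu)); a real solver would use a rigorous
    # interval library with directed rounding.
    (xl, xu), (yl, yu) = box
    sq_min = lambda lo, hi: 0.0 if lo <= 0.0 <= hi else min(lo * lo, hi * hi)
    xy_max = max(xl * yl, xl * yu, xu * yl, xu * yu)
    return sq_min(xl, xu) + sq_min(yl, yu) - xy_max

def midpoint_value(box):
    # f evaluated at the box midpoint: a valid upper bound on the minimum.
    x = 0.5 * (box[0][0] + box[0][1])
    y = 0.5 * (box[1][0] + box[1][1])
    return x * x + y * y - x * y

def branch_and_bound(box, tol=1e-6, max_iter=100000):
    best_ub = midpoint_value(box)
    heap = [(interval_lower_bound(box), box)]
    lb = heap[0][0]
    for _ in range(max_iter):
        if not heap:
            break
        lb, b = heapq.heappop(heap)           # box with the smallest lower bound
        if best_ub - lb <= tol:
            break                             # [lb, best_ub] encloses the global minimum
        best_ub = min(best_ub, midpoint_value(b))
        i = max(range(len(b)), key=lambda j: b[j][1] - b[j][0])   # widest dimension
        lo, hi = b[i]
        mid = 0.5 * (lo + hi)
        for half in ((lo, mid), (mid, hi)):
            child = b[:i] + (half,) + b[i + 1:]
            clb = interval_lower_bound(child)
            if clb <= best_ub:                # discard boxes that cannot hold the minimum
                heapq.heappush(heap, (clb, child))
    return lb, best_ub

print(branch_and_bound(((-2.0, 2.0), (-2.0, 2.0))))   # minimum of f is 0 at (0, 0)
```

Swapping the naive `interval_lower_bound` for a bound certified by a safely rounded linear relaxation is precisely the step the paper is concerned with.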

Similar articles

INTRODUCTION AND DEVELOPMENT OF SURROGATE MANAGEMENT FRAMEWORK FOR SOLVING OPTIMIZATION PROBLEMS

In this paper, we have outlined the surrogate management framework for optimization of expensive functions. An initial simple iterative method which we call the “Strawman” method illustrates how surrogates can be incorporated into optimization to stand in for the most expensive function. These ideas are made rigorous by incorporating them into the framework of pattern search methods. The SMF al...

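The blurb above only names the surrogate and pattern-search ingredients, so here is a toy version of the search/poll loop such frameworks follow, assuming a least-squares quadratic surrogate, random search candidates, and a coordinate poll. `expensive_f`, `smf_toy`, and every other name are hypothetical stand-ins; this is not the paper's SMF.

```python
# Toy surrogate search/poll loop: a cheap quadratic surrogate proposes a
# candidate (search step); if it fails, a coordinate poll around the
# incumbent is tried and the step (mesh) is refined. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def expensive_f(x):                       # stands in for a costly simulation
    return (x[0] - 1.0)**2 + (x[1] + 0.5)**2

def quad_features(x):
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2])

def fit_surrogate(X, y):
    A = np.array([quad_features(x) for x in X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda x: quad_features(x) @ coef

def smf_toy(x0, step=1.0, tol=1e-3, budget=200):
    x, fx = np.asarray(x0, float), expensive_f(x0)
    X, y = [x.copy()], [fx]
    while step > tol and len(y) < budget:
        surrogate = fit_surrogate(X, y)
        # SEARCH: rank random trial points with the surrogate, try the best one
        trials = [x + step * rng.uniform(-1.0, 1.0, 2) for _ in range(20)]
        cand = min(trials, key=surrogate)
        X.append(cand); y.append(expensive_f(cand))
        if y[-1] < fx:
            x, fx = cand, y[-1]
            continue
        # POLL: pattern-search step along the coordinate directions
        improved = False
        for d in (np.array([1.0, 0.0]), np.array([-1.0, 0.0]),
                  np.array([0.0, 1.0]), np.array([0.0, -1.0])):
            p = x + step * d
            X.append(p); y.append(expensive_f(p))
            if y[-1] < fx:
                x, fx, improved = p, y[-1], True
                break
        if not improved:
            step *= 0.5                   # refine the mesh
    return x, fx

print(smf_toy([3.0, 2.0]))                # converges toward (1, -0.5)
```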

An Efficient Neurodynamic Scheme for Solving a Class of Nonconvex Nonlinear Optimization Problems

By p-power (or partial p-power) transformation, the Lagrangian function in a nonconvex optimization problem becomes locally convex. In this paper, we present a neural network based on an NCP function for solving the nonconvex optimization problem. An important feature of this neural network is the one-to-one correspondence between its equilibria and KKT points of the nonconvex optimizatio...

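As a concrete illustration of the NCP-function idea mentioned above (and only of that idea, not of the paper's neural network or its p-power transformation), the Fischer-Burmeister function turns the complementarity part of the KKT conditions into plain equations, which is what makes an equilibria/KKT correspondence possible. The example problem and the use of SciPy's `fsolve` are assumptions made for the sketch.

```python
# The Fischer-Burmeister function phi(a, b) = a + b - sqrt(a^2 + b^2) is zero
# exactly when a >= 0, b >= 0 and a*b = 0, so KKT conditions can be rewritten
# as a square system of equations. Illustrative example, not the paper's model.
import numpy as np
from scipy.optimize import fsolve

def phi_fb(a, b):
    return a + b - np.sqrt(a * a + b * b)

# min (x1-2)^2 + (x2-1)^2  subject to  g(x) = x1^2 + x2^2 - 1 <= 0
def kkt_residual(z):
    x, lam = z[:2], z[2]
    grad_f = 2 * (x - np.array([2.0, 1.0]))
    grad_g = 2 * x
    g = x @ x - 1.0
    return np.array([
        grad_f[0] + lam * grad_g[0],   # stationarity
        grad_f[1] + lam * grad_g[1],
        phi_fb(lam, -g),               # lam >= 0, g <= 0, lam * g = 0
    ])

x1, x2, lam = fsolve(kkt_residual, [0.5, 0.5, 1.0])
print(x1, x2, lam)   # about (0.894, 0.447), lam about sqrt(5) - 1
```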

An efficient improvement of the Newton method for solving nonconvex optimization problems

The Newton method is one of the most famous numerical line-search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method for solving unconstrained optimization problems is presented. The significant ...

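For context, the sketch below is the standard damped Newton iteration with an Armijo backtracking line search, i.e. the textbook baseline in which the search direction and step length mentioned above appear; the paper's actual modification is truncated in the abstract and is not reproduced here.

```python
# Damped Newton method with Armijo backtracking for unconstrained minimization.
import numpy as np

def newton_minimize(f, grad, hess, x0, tol=1e-8, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)   # Newton direction: solve H d = -g
        if g @ d >= 0:                     # nonconvex region: fall back to steepest descent
            d = -g
        t, fx = 1.0, f(x)                  # Armijo backtracking for the step length
        while t > 1e-12 and f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
    return x

# Example: Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400 * x[1] + 1200 * x[0]**2, -400 * x[0]],
                           [-400 * x[0], 200.0]])
print(newton_minimize(f, grad, hess, [-1.2, 1.0]))   # converges to (1, 1)
```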

An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...


An efficient modified neural network for solving nonlinear programming problems with hybrid constraints

This paper presents optimization techniques for solving convex programming problems with hybrid constraints. According to the saddle point theorem, optimization theory, convex analysis theory, Lyapunov stability theory and the LaSalle invariance principle, a neural network model is constructed. The equilibrium point of the proposed model is proved to be equivalent to the optima...


An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...

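As a reference point only, the following is a generic nonlinear conjugate gradient loop (Polak-Ribiere+ with restarts and Armijo backtracking). The paper's conjugacy parameter derived from a modified secant condition is not reproduced, so this is not the proposed method, just the iteration such a parameter would plug into.

```python
# Generic nonlinear conjugate gradient (Polak-Ribiere+) with restarts and an
# Armijo backtracking line search. Illustrative sketch, not the paper's method.
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx, slope = 1.0, f(x), g @ d            # backtracking line search along d
        while t > 1e-14 and f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere+ parameter; clamping at 0 acts as an automatic restart
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0:                         # safeguard: steepest-descent restart
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(nonlinear_cg(f, grad, [0.0, 0.0]))           # about [0.2, 0.4]
```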

Journal title:

Volume   Issue

Pages  -

Publication date: 2012